
    Specifying OCL Constraints on Process Instantiations

    Due to the variety of concerns affecting software development in large organisations, processes have to be adapted to project-specific needs to be effectively applicable in individual projects. We describe a project aiming to provide tool support for this individualised instantiation of reference processes, based on an OCL-based specification of instantiation operations. The aim is not only to execute instantiation decisions made by humans but also to automatically ensure the correctness of the resulting process, potentially resulting in follow-up actions being executed or suggested.
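The idea can be sketched in a few lines. The following is a hypothetical illustration, not the project's actual tool or its OCL specification: an instantiation operation removes an optional activity from a reference process, and a correctness check then detects the control flow left dangling, which a tool could turn into a suggested follow-up action. All names (`Process`, `drop_activity`, `dangling`) are invented for this sketch.

```python
from dataclasses import dataclass

@dataclass
class Process:
    activities: set
    flows: set  # (predecessor, successor) pairs

def drop_activity(p: Process, name: str) -> Process:
    """Instantiation operation: remove an activity and all its flows."""
    return Process(
        activities=p.activities - {name},
        flows={(a, b) for (a, b) in p.flows if name not in (a, b)},
    )

def dangling(p: Process, start: str, end: str):
    """Correctness check (an OCL-style invariant): find activities left
    without a successor or predecessor after instantiation."""
    have_succ = {a for (a, _) in p.flows}
    have_pred = {b for (_, b) in p.flows}
    no_succ = {a for a in p.activities if a not in have_succ and a != end}
    no_pred = {a for a in p.activities if a not in have_pred and a != start}
    return no_succ, no_pred

ref = Process({"design", "review", "release"},
              {("design", "review"), ("review", "release")})
inst = drop_activity(ref, "review")
no_succ, no_pred = dangling(inst, "design", "release")
# "design" lost its successor and "release" its predecessor, so a tool
# could suggest the follow-up action of adding a direct flow between them.
```

In the project described above, such invariants would be stated declaratively in OCL rather than hand-coded; this sketch only shows the kind of violation an automated check would surface.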

    Towards a linked information architecture for integrated law enforcement

    Paper presented at the Workshop on Linked Democracy: Artificial Intelligence for Democratic Innovation, co-located with the 26th International Joint Conference on Artificial Intelligence (IJCAI 2017), held on 19 August 2017 in Melbourne, Australia. Law enforcement agencies are facing an ever-increasing flood of data to be acquired, stored, assessed and used. Automation and advanced data analysis capabilities are required to supersede traditional manual work processes and legacy information silos by automatically acquiring information from a range of sources, analyzing it in the context of ongoing investigations, and linking it to other pieces of knowledge pertaining to the investigation. This paper outlines a modular architecture for the management of linked data in the law enforcement domain and discusses legal and policy issues related to workflows and information sharing in this context.

    Bitemporal Support for Business Process Contingency Management

    Modern organisations are increasingly moving from traditional monolithic business systems to environments where more and more tasks are outsourced to third-party providers. Therefore, processes must operate in an open and dynamic environment in which the management of time plays a crucial role. Handling time, however, remains a challenging issue yet to be fully addressed. Traditional processing systems only consider business events in a single time dimension, but are unable to handle bitemporal events: events in two time dimensions. Recently, backend systems have started to provide increased support for handling bitemporal events, but these enhanced capabilities have not been carried through to business process management systems. In this paper, we consider the possible relationships that exist between bitemporal properties of events and we show how these relationships affect a business process. In addition, we demonstrate how bitemporal events can be handled to prevent certain undesired effects on the business process.
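The two time dimensions are conventionally the valid time (when a fact holds in the real world) and the transaction time (when the system recorded it). The sketch below is an illustrative minimal model, not the paper's formalism; it shows one relationship between the two dimensions, a retroactive event, which is exactly the case where a process that has already acted on stale information may need compensation.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class BitemporalEvent:
    name: str
    valid_time: date        # when the event actually occurred
    transaction_time: date  # when the system recorded it

def is_retroactive(e: BitemporalEvent) -> bool:
    """The system learned about the event only after it had occurred."""
    return e.transaction_time > e.valid_time

# A payment that happened on the 1st but was only recorded on the 5th:
late = BitemporalEvent("payment", date(2024, 3, 1), date(2024, 3, 5))
# A process that branched on "payment missing" between the 1st and the
# 5th may need a compensating action once the late record arrives.
```

A business process management system that tracked only a single timestamp could not distinguish this event from one recorded on time, which is the gap the paper addresses.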

    Spent convictions and the architecture for establishing legal semantic workflows

    This research was partially funded by the Data to Decisions Cooperative Research Centre (D2D CRC, Australia) and Meta-Rule of Law (DER2016-78108-P, Spain). Operating within the D2D CRC, the authors are currently involved in the Integrated Law Enforcement program and the Compliance through Design project. These have the goal of developing a federated data platform for law enforcement agencies that will enable the execution of integrated analytics on data accessed from different external and internal sources, thereby providing effective support to an investigator or analyst working to evaluate evidence and manage lines of inquiry in an investigation. Technical solutions should also operate ethically, in compliance with the law and subject to good governance principles. This paper focuses on the Australian spent convictions scheme, which provides use cases to test the platform.

    Governance in the age of social machines: the web observatory

    The World Wide Web has provided unprecedented access to information; as humans and machines increasingly interact with it they provide more and more data. The challenge is how to analyse and interpret this data within the context that it was created, and to present it in a way that both researchers and practitioners can more easily make sense of. The first step is to have access to open and interoperable data sets, which Governments around the world are increasingly subscribing to. But having ‘open’ data is just the beginning and does not necessarily lead to better decision making or policy development. This is because data do not provide the answers – they need to be analysed, interpreted and understood within the context of their creation, and the business imperative of the organisation using them. The major corporate entities, such as Google, Amazon, Microsoft, Apple and Facebook, have the capabilities to do this, but are driven by their own commercial imperatives, and their data are largely siloed and held within ‘walled gardens’ of information. All too often governments and non-profit groups lack these capabilities, and are driven by very different mandates. In addition they have far more complex community relationships, and must abide by regulatory constraints which dictate how they can use the data they hold. As such they struggle to maximise the value of this emerging ‘digital currency’ and are therefore largely beholden to commercial vendors. What has emerged is a public-private data ecosystem that has huge policy implications (including the twin challenges of privacy and security). Many within the public sector lack the skills to address these challenges because they lack the literacy required within the digital context. 
    This project seeks to address some of these problems by bringing together a safe and secure Australian-based data platform (facilitating the sharing of data, analytics and visualisation) with policy analysis and governance expertise in order to create a collaborative working model of a ‘Government Web Observatory’. This neutral space, hosted by an Australian university, can serve as a powerful complement to existing Open Data initiatives in Australia, and enable research and education to combine to support the development of a more digitally literate public service. The project aims to explore where, and in which contexts, people, things, data and the Internet meet and result in evolving observable phenomena which can inform better government policy development and service delivery.

    (Re)configuration based on model generation

    Reconfiguration is an important activity for companies selling configurable products or services which have a long life time. However, identification of a set of required changes in a legacy configuration is a hard problem, since even small changes in the requirements might imply significant modifications. In this paper we show a solution based on answer set programming, which is a logic-based knowledge representation formalism well suited for a compact description of (re)configuration problems. Its applicability is demonstrated on simple abstractions of several real-world scenarios. The evaluation of our solution on a set of benchmark instances derived from commercial (re)configuration problems shows its practical applicability. Comment: In Proceedings LoCoCo 2011, arXiv:1108.609
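The problem itself is easy to state even without the answer-set-programming machinery. The toy sketch below is a brute-force illustration under invented data, not the paper's ASP encoding: given a legacy assignment of components to modules that violates a capacity constraint, find a valid assignment that changes as little of the legacy configuration as possible. An ASP solver would express the constraint and the minimisation declaratively and scale far beyond this enumeration.

```python
from itertools import product

components = ["c1", "c2", "c3"]
modules = ["m1", "m2"]
capacity = 2                                    # at most 2 components per module
legacy = {"c1": "m1", "c2": "m1", "c3": "m1"}   # violates the capacity limit

def valid(assignment):
    """Capacity constraint: no module hosts more than `capacity` components."""
    return all(sum(1 for m in assignment.values() if m == mod) <= capacity
               for mod in modules)

def changes(assignment):
    """Cost of a reconfiguration: number of components moved off legacy."""
    return sum(1 for c in components if assignment[c] != legacy[c])

# Enumerate all assignments, keep the valid ones, pick a minimal-change one.
best = min(
    (dict(zip(components, ms))
     for ms in product(modules, repeat=len(components))
     if valid(dict(zip(components, ms)))),
    key=changes,
)
# Moving a single component off the overloaded module suffices here.
```

In ASP terms, `valid` would become an integrity constraint and `changes` a weak-constraint objective; the solver then returns an optimal answer set directly instead of enumerating.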
